Explosive Percolation in the Human Protein Homology Network
We study the explosive character of the percolation transition in a
real-world network. We show that the emergence of a spanning cluster in the
Human Protein Homology Network (H-PHN) exhibits similar features to an
Achlioptas-type process and is markedly different from regular random
percolation. The underlying mechanism of this transition can be described by
slow-growing clusters that remain isolated until the later stages of the
process, when the addition of a small number of links leads to the rapid
interconnection of these modules into a giant cluster. Our results indicate
that the evolutionary-based process that shapes the topology of the H-PHN
through duplication-divergence events may occur in sudden steps, similarly to
what is seen in first-order phase transitions.
Comment: 13 pages, 6 figures
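The cluster-merging mechanism described above can be illustrated with a toy Achlioptas-type simulation (our own illustrative code, not the H-PHN analysis): under the product rule, each step draws two candidate links and keeps the one whose endpoint clusters have the smaller size product, which suppresses early cluster growth and makes the giant cluster appear abruptly once the isolated modules finally connect. All names below are ours.

```python
import random

class DSU:
    """Disjoint-set union tracking component sizes and the largest size."""
    def __init__(self, n):
        self.parent = list(range(n))
        self.size = [1] * n
        self.max_size = 1

    def find(self, x):
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path halving
            x = self.parent[x]
        return x

    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra == rb:
            return
        if self.size[ra] < self.size[rb]:
            ra, rb = rb, ra
        self.parent[rb] = ra
        self.size[ra] += self.size[rb]
        self.max_size = max(self.max_size, self.size[ra])

def largest_cluster_trace(n, steps, product_rule, seed=0):
    """Grow a graph on n nodes by adding `steps` links.  Random percolation
    adds a uniformly random link; the Achlioptas product rule draws two
    candidate links and keeps the one whose endpoint clusters have the
    smaller size product.  Returns the largest-cluster fraction per step."""
    rng = random.Random(seed)
    dsu = DSU(n)
    trace = []
    for _ in range(steps):
        best = (rng.randrange(n), rng.randrange(n))
        if product_rule:
            alt = (rng.randrange(n), rng.randrange(n))
            def prod(e):
                return dsu.size[dsu.find(e[0])] * dsu.size[dsu.find(e[1])]
            if prod(alt) < prod(best):
                best = alt
        dsu.union(*best)
        trace.append(dsu.max_size / n)
    return trace
```

Comparing the two traces at the same link density shows the signature effect: well past the random-percolation threshold the product-rule run still has only small clusters, then jumps.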
Matchings on infinite graphs
Elek and Lippner (2010) showed that the convergence of a sequence of
bounded-degree graphs implies the existence of a limit for the proportion of
vertices covered by a maximum matching. We provide a characterization of the
limiting parameter via a local recursion defined directly on the limit of the
graph sequence. Interestingly, the recursion may admit multiple solutions,
implying non-trivial long-range dependencies between the covered vertices. We
overcome this lack of correlation decay by introducing a perturbative parameter
(temperature), which we let progressively go to zero. This allows us to
uniquely identify the correct solution. In the important case where the graph
limit is a unimodular Galton-Watson tree, the recursion simplifies into a
distributional equation that can be solved explicitly, leading to a new
asymptotic formula that considerably extends the well-known one by Karp and
Sipser for Erdős-Rényi random graphs.
Comment: 23 pages
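The Karp-Sipser result mentioned above concerns a simple leaf-removal heuristic whose analysis yields the classical matching-number formula for Erdős-Rényi graphs. A minimal sketch of that heuristic (our own illustrative code, not the paper's local recursion; in the core phase we pick a random vertex and then a neighbour, which only approximates a uniformly random edge):

```python
import random

def karp_sipser_matching(adj, seed=0):
    """Karp-Sipser heuristic: while some vertex has degree 1, match it to its
    unique neighbour (such moves never hurt the matching size); otherwise
    match the endpoints of a randomly chosen remaining edge."""
    rng = random.Random(seed)
    adj = {v: set(nbrs) for v, nbrs in adj.items()}  # work on a copy
    matching = []

    def remove(v):
        for u in adj.pop(v):
            adj[u].discard(v)

    while any(adj.values()):
        leaves = [v for v, nb in adj.items() if len(nb) == 1]
        if leaves:
            v = leaves[0]
            u = next(iter(adj[v]))
        else:
            # 2-core phase: random vertex, then random neighbour
            # (an approximation to a uniformly random edge)
            v = rng.choice([w for w, nb in adj.items() if nb])
            u = rng.choice(sorted(adj[v]))
        matching.append((v, u))
        remove(v)
        remove(u)
    return matching
```

On sparse random graphs the leaf phase alone already finds a near-maximum matching, which is what makes the heuristic analytically tractable.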
The early evolution of the H-free process
The H-free process, for some fixed graph H, is the random graph process
defined by starting with an empty graph on n vertices and then adding edges one
at a time, chosen uniformly at random subject to the constraint that no H
subgraph is formed. Let G be the random maximal H-free graph obtained at the
end of the process. When H is strictly 2-balanced, we show that for some c>0,
with high probability as n tends to infinity, the minimum degree in G is at least
cn^{1-(v_H-2)/(e_H-1)}(log n)^{1/(e_H-1)}. This gives new lower bounds for
the Turán numbers of certain bipartite graphs, such as the complete bipartite
graphs K_{r,r} with r >= 5. When H is a complete graph K_s with s >= 5, we show
that for some C>0, with high probability the independence number of
G is at most Cn^{2/(s+1)}(log n)^{1-1/(s(s-1)/2-1)}. This gives new lower bounds
for Ramsey numbers R(s,t) for fixed s and t large. We also obtain new
bounds for the independence number of G for other graphs H, including the case
when H is a cycle. Our proofs use the differential equations method for random
graph processes to analyse the evolution of the process, and give further
information about the structure of the graphs obtained, including asymptotic
formulae for a broad class of subgraph extension variables.
Comment: 36 pages
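As a concrete, purely illustrative instance, the process for H = K3 can be simulated in a few lines (our own code and names). We use the equivalence between choosing a uniformly random legal edge at each step and processing a uniformly random permutation of all vertex pairs, keeping each pair unless it would close a triangle; once a pair becomes illegal it stays illegal, so the two formulations produce the same process.

```python
import random
from itertools import combinations

def triangle_free_process(n, seed=0):
    """Run the H-free process for H = K3: scan a uniformly random permutation
    of all vertex pairs, adding each pair as an edge unless it would create a
    triangle.  The final graph is maximal triangle-free."""
    rng = random.Random(seed)
    adj = {v: set() for v in range(n)}
    pairs = list(combinations(range(n), 2))
    rng.shuffle(pairs)
    edges = []
    for a, b in pairs:
        if adj[a] & adj[b]:          # a common neighbour: (a,b) closes a triangle
            continue
        adj[a].add(b)
        adj[b].add(a)
        edges.append((a, b))
    return adj, edges
```

By construction the final graph G is triangle-free and maximal: every non-edge has a common neighbour, which in particular forces a positive minimum degree.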
Conceptualizing throughput legitimacy: procedural mechanisms of accountability, transparency, inclusiveness and openness in EU governance
This symposium demonstrates the potential of throughput legitimacy as a concept for shedding empirical light on the strengths and weaknesses of multi-level governance, as well as for challenging the concept theoretically. This article introduces the symposium by conceptualizing throughput legitimacy as an ‘umbrella concept’, encompassing a constellation
of normative criteria that are not necessarily empirically interrelated. It argues that, in order to interrogate multi-level governance processes in all their complexity, it makes sense to develop normative standards that are not naïve about the empirical realities of how power is exercised within multi-level governance, or how it may interact with legitimacy. We argue that while throughput legitimacy has its normative limits, it can be substantively useful for these purposes. While no replacement for input and output legitimacy, throughput legitimacy offers distinctive normative criteria (accountability, transparency, inclusiveness and openness) and points towards substantive institutional reforms.
Published version
Smoothed Complexity Theory
Smoothed analysis is a new way of analyzing algorithms introduced by Spielman
and Teng (J. ACM, 2004). Classical methods like worst-case or average-case
analysis have accompanying complexity classes, like P and AvgP, respectively.
While worst-case or average-case analysis gives us a means to talk about the
running time of a particular algorithm, complexity classes allow us to talk
about the inherent difficulty of problems.
Smoothed analysis is a hybrid of worst-case and average-case analysis and
compensates for some of their drawbacks. Despite its success in the analysis of
single algorithms and problems, there is no embedding of smoothed analysis into
computational complexity theory, which is necessary to classify problems
according to their intrinsic difficulty.
We propose a framework for smoothed complexity theory, define the relevant
classes, and prove some first hardness results (of bounded halting and tiling)
and tractability results (binary optimization problems, graph coloring,
satisfiability). Furthermore, we discuss extensions and shortcomings of our
model and relate it to semi-random models.
Comment: to be presented at MFCS 201
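The hybrid character of smoothed analysis, a worst case over inputs of an expected cost under random perturbation, can be illustrated with the textbook quicksort example (a sketch with our own function names, unrelated to the complexity classes defined in the paper):

```python
import random

def quicksort_comparisons(a):
    """Count comparisons made by quicksort with the first element as pivot.
    Worst case: an already-sorted input gives n(n-1)/2 comparisons."""
    if len(a) <= 1:
        return 0
    pivot = a[0]
    left = [x for x in a[1:] if x < pivot]
    right = [x for x in a[1:] if x >= pivot]
    return len(a) - 1 + quicksort_comparisons(left) + quicksort_comparisons(right)

def smoothed_cost(adversarial_input, sigma, trials=30, seed=0):
    """Empirical smoothed cost at one adversarial input: the expected number
    of comparisons after adding Gaussian noise of magnitude sigma.  Smoothed
    analysis takes the maximum of this expectation over all inputs; sigma -> 0
    recovers worst case, large sigma approaches average case."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        perturbed = [x + rng.gauss(0, sigma) for x in adversarial_input]
        total += quicksort_comparisons(perturbed)
    return total / trials
```

Even modest noise destroys the quadratic worst case here, which is the phenomenon the smoothed complexity classes in the paper are designed to capture at the level of problems rather than single algorithms.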
Amyloid Deposition in Transplanted Human Pancreatic Islets: A Conceivable Cause of Their Long-Term Failure
Following the encouraging report of the Edmonton group, there was a rejuvenation of the islet transplantation field. After that, more pessimistic views spread when the long-term clinical outcomes were published. A progressive loss of β-cell function meant that almost all patients were back on insulin therapy after 5 years. More than 10 years ago, we demonstrated that amyloid deposits rapidly formed in human islets and in mouse islets transgenic for human IAPP when grafted into nude mice. It is therefore conceivable to consider amyloid formation as one potential cause of this long-term failure. The present paper reviews attempts in our laboratories to elucidate the dynamics of and mechanisms behind the formation of amyloid in transplanted islets, with special emphasis on the impact of long-term hyperglycemia.
On the source, site and modes of domination
This article examines how domination manifests in social relationships and institutions. It does so by examining two debates in the republican literature. The first is whether domination requires institutionalisation; this addresses the source of domination. The second concerns the nature of arbitrary power; this raises questions about the site of domination. It will be argued that the source of domination can be personally or socially constituted, and that the site can be interactional or systemic. This yields four modes of domination that can be used to examine social institutions and relationships.
Utilizing volatile organic compounds for early detection of Fusarium circinatum
Acknowledgements: This study was financially supported by The Swedish Research Council Formas, Grant #2018-00966, Crafoordska stiftelsen Grant #20200631, Carl Tryggers Stiftelse för Vetenskaplig Forskning Grant 18:67, The Royal Swedish Academy of Agriculture and Forestry, Stiftelsen fonden för skogsvetenskaplig forskning, Erasmus+ Staff mobility grant, Anna-Britta & Vadim Söderströms resestipendium and NordGen Forest SNS scholarships. J.N.S. was supported by The European Union’s Horizon Europe research and innovation programme under the MSCA agreement No 101068728. Thanks to Dr. R.R. Vetukuri for providing F. graminearum, to the staff of Laboratorio de Técnicas Instrumentales, Universidad de Valladolid, for providing access to lab facilities, and to J-E. Englund for assistance with the experimental design. Funding: Open access funding provided by Swedish University of Agricultural Sciences.
Peer reviewed. Publisher PDF.
Improved limit on the directly measured antiproton lifetime
Continuous monitoring of a cloud of antiprotons stored in a Penning trap for 405 days enables us to set an improved limit on the directly measured antiproton lifetime. From our measurements we extract a storage time of 3.15×10^8 equivalent antiproton-seconds, resulting in a lower lifetime limit of τ_p̄ > 10.2 a (years) at a confidence level of 68%. This result improves the limit on charge-parity-time violation in antiproton decays based on direct observation by a factor of 7.